Searching for minimal optimal neural networks

Authors

Abstract

Large neural network models have high predictive power but may suffer from overfitting if the training set is not large enough. Therefore, it is desirable to select an appropriate size for the network. The destructive approach, which starts with a large architecture and then reduces its size using a Lasso-type penalty, has been used extensively for this task. Despite its popularity, there is no theoretical guarantee for this technique. Based on the notion of minimal neural networks, we posit a rigorous mathematical framework for studying its asymptotic theory. We prove that the Adaptive group Lasso is consistent and can reconstruct the correct number of hidden nodes of one-hidden-layer feedforward networks with high probability. To the best of our knowledge, this is the first result establishing ...
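The key mechanism of the destructive approach described above can be illustrated with a small sketch. The following is not the paper's procedure, only a minimal, assumed illustration of how an adaptive group-Lasso proximal step prunes hidden nodes: each hidden node's incoming weights form one group, the adaptive penalty weight is taken as the inverse norm of an initial estimate (so weak nodes are penalized more), and block soft-thresholding zeroes out entire groups. All names (`block_soft_threshold`, the toy weight matrix) are hypothetical.

```python
import numpy as np

def block_soft_threshold(W, lam, adaptive_weights):
    """Proximal operator of the adaptive group-Lasso penalty.

    Each row of W holds the incoming weights of one hidden node (one
    group). Rows whose norm falls below the node-specific threshold
    lam * adaptive_weights[j] are zeroed, which removes that node;
    surviving rows are shrunk toward zero.
    """
    W_new = np.zeros_like(W)
    for j, w in enumerate(W):
        norm = np.linalg.norm(w)
        thresh = lam * adaptive_weights[j]
        if norm > thresh:
            W_new[j] = (1.0 - thresh / norm) * w
    return W_new

# Toy example: 5 hidden nodes, 3 inputs; nodes 3 and 4 are nearly dead.
W = np.array([[1.00, -2.00,  0.500],
              [0.80,  0.30, -1.100],
              [2.00,  0.00,  0.400],
              [0.01, -0.02,  0.005],
              [0.00,  0.01,  0.000]])

# Adaptive weights from an initial (e.g. unpenalized) estimate:
# nodes with small initial norms receive large penalties.
adaptive = 1.0 / (np.linalg.norm(W, axis=1) + 1e-12)

W_pruned = block_soft_threshold(W, lam=0.05, adaptive_weights=adaptive)
active = int(np.count_nonzero(np.linalg.norm(W_pruned, axis=1)))
print(active)  # → 3 (the two near-zero nodes are pruned)
```

With this choice of adaptive weights, a node survives exactly when the squared norm of its weight group exceeds `lam`, which is one simple way the penalty can recover the correct number of hidden nodes in this toy setting.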


Similar resources

Searching for optimal size neural networks

Assembler Encoding represents a neural network in the form of a simple program called Assembler Encoding Program. The task of the program is to create the so-called Network Definition Matrix, which maintains all the information necessary to construct a network. To generate the programs and, in consequence, neural networks, evolutionary techniques are used. One of the problems in Assembler Encod...


Automatically searching near-optimal artificial neural networks

The idea of automatically searching neural networks that learn faster and generalize better is becoming increasingly widespread. In this paper, we present a new method for searching near-optimal artificial neural networks that include initial weights, transfer functions, architectures and learning rules that are specially tailored to a given problem. Experimental results have shown that the met...


Searching the Sky for Neural Networks

Sky computing is a new computing paradigm leveraging resources of multiple Cloud providers to create a large-scale distributed infrastructure. N2Sky is a research initiative promising a framework for the utilization of Neural Networks as services across many Clouds integrating into a Sky. This involves a number of challenges ranging from the provision, discovery and utilization of N2Sky service...


A DSS-Based Dynamic Programming for Finding Optimal Markets Using Neural Networks and Pricing

One of the substantial challenges in marketing efforts is determining optimal markets, specifically in market segmentation. The problem is more controversial in electronic commerce and electronic marketing. Consumer behaviour is influenced by different factors and thus varies in different time periods. These dynamic impacts lead to the uncertain behaviour of consumers and therefore harden the t...


Minimal Gated Unit for Recurrent Neural Networks

Recently, recurrent neural networks (RNNs) have been very successful in handling sequence data. However, understanding RNNs and finding the best practices for them is a difficult task, partly because there are many competing and complex hidden units (such as LSTM and GRU). We propose a gated unit for RNNs, named Minimal Gated Unit (MGU), since it only contains one gate, which is a minimal design a...



Journal

Journal title: Statistics & Probability Letters

Year: 2022

ISSN: 1879-2103, 0167-7152

DOI: https://doi.org/10.1016/j.spl.2021.109353